Conditional independence

In probability theory, two events ''R'' and ''B'' are conditionally independent given a third event ''Y'' precisely if the occurrence or non-occurrence of ''R'' ''and'' the occurrence or non-occurrence of ''B'' are independent events in their conditional probability distribution given ''Y''. In other words, ''R'' and ''B'' are conditionally independent given ''Y'' if and only if, given knowledge that ''Y'' occurs, knowledge of whether ''R'' occurs provides no information on the likelihood of ''B'' occurring, and knowledge of whether ''B'' occurs provides no information on the likelihood of ''R'' occurring.
== Formal definition ==

In the standard notation of probability theory, ''R'' and ''B'' are conditionally independent given ''Y'' if and only if
:\Pr(R \cap B \mid Y) = \Pr(R \mid Y)\Pr(B \mid Y),\,
or equivalently,
:\Pr(R \mid B \cap Y) = \Pr(R \mid Y).\,
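The two forms of the definition can be checked numerically on a toy distribution over three binary events ''R'', ''B'', ''Y''. This is only a sketch: the probabilities below are illustrative, chosen so that ''R'' and ''B'' are independent given ''Y'' by construction.

```python
# Toy joint distribution built as p(y) * p(r|y) * p(b|y), so that
# R and B are conditionally independent given Y by construction.
p_y = {0: 0.4, 1: 0.6}                  # Pr(Y = y)  (illustrative numbers)
p_r_given_y = {0: 0.2, 1: 0.7}          # Pr(R | Y = y)
p_b_given_y = {0: 0.5, 1: 0.9}          # Pr(B | Y = y)

def p(r, b, y):
    """Joint probability of (R = r, B = b, Y = y)."""
    pr = p_r_given_y[y] if r else 1 - p_r_given_y[y]
    pb = p_b_given_y[y] if b else 1 - p_b_given_y[y]
    return p_y[y] * pr * pb

# First form: Pr(R ∩ B | Y) = Pr(R | Y) Pr(B | Y), checked for Y = 1.
pr_y = sum(p(1, b, 1) for b in (0, 1)) / p_y[1]      # Pr(R | Y = 1)
pb_y = sum(p(r, 1, 1) for r in (0, 1)) / p_y[1]      # Pr(B | Y = 1)
prb_y = p(1, 1, 1) / p_y[1]                          # Pr(R ∩ B | Y = 1)

# Equivalent form: Pr(R | B ∩ Y) = Pr(R | Y).
pr_given_by = p(1, 1, 1) / sum(p(r, 1, 1) for r in (0, 1))
```

Both checks agree: `prb_y` equals `pr_y * pb_y`, and `pr_given_by` equals `pr_y`.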
Two random variables ''X'' and ''Y'' are conditionally independent given a third random variable ''Z'' if and only if they are independent in their conditional probability distribution given ''Z''. That is, ''X'' and ''Y'' are conditionally independent given ''Z'' if and only if, given any value of ''Z'', the probability distribution of ''X'' is the same for all values of ''Y'' and the probability distribution of ''Y'' is the same for all values of ''X''.
Two events ''R'' and ''B'' are conditionally independent given a σ-algebra Σ if
:\Pr(R \cap B \mid \Sigma) = \Pr(R \mid \Sigma)\Pr(B \mid \Sigma)\ a.s.
where \Pr(A \mid \Sigma) denotes the conditional expectation of the indicator function of the event A, \chi_A, given the sigma algebra \Sigma. That is,
:\Pr(A \mid \Sigma) := \operatorname{E}[\chi_A \mid \Sigma].
Two random variables ''X'' and ''Y'' are conditionally independent given a σ-algebra ''Σ'' if the above equation holds for all ''R'' in σ(''X'') and ''B'' in σ(''Y'').
Two random variables ''X'' and ''Y'' are conditionally independent given a random variable ''W'' if they are independent given σ(''W''): the σ-algebra generated by ''W''. This is commonly written:
:X \perp\!\!\!\perp Y \mid W
or
:X \perp Y \mid W.
This is read "''X'' is independent of ''Y'', given ''W''"; the conditioning applies to the whole statement, as in
:(X \perp\!\!\!\perp Y) \mid W.
If ''W'' assumes a countable set of values, this is equivalent to the conditional independence of ''X'' and ''Y'' for the events of the form [''W'' = ''w''].
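For a countable-valued ''W'', this equivalence can be tested by checking, within each event [''W'' = ''w''], that the conditional joint distribution factorizes. A minimal sketch (the joint table is illustrative, built so the property holds by construction):

```python
from itertools import product

# Joint table p(x, y, w) = p(w) * p(x|w) * p(y|w) over binary X, Y, W,
# constructed so that X and Y are conditionally independent given W.
p_w = {0: 0.3, 1: 0.7}
p_x_w = {0: {0: 0.1, 1: 0.9}, 1: {0: 0.6, 1: 0.4}}  # Pr(X = x | W = w)
p_y_w = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.2, 1: 0.8}}  # Pr(Y = y | W = w)
joint = {(x, y, w): p_w[w] * p_x_w[w][x] * p_y_w[w][y]
         for x, y, w in product((0, 1), repeat=3)}

def cond_indep_given_w(joint):
    """Check Pr(X=x, Y=y | W=w) == Pr(X=x | W=w) Pr(Y=y | W=w) for every w."""
    ws = {w for (_, _, w) in joint}
    for w in ws:
        pw = sum(p for (x, y, w2), p in joint.items() if w2 == w)
        for x, y in product((0, 1), repeat=2):
            pxy = joint[(x, y, w)] / pw
            px = sum(joint[(x, y2, w)] for y2 in (0, 1)) / pw
            py = sum(joint[(x2, y, w)] for x2 in (0, 1)) / pw
            if abs(pxy - px * py) > 1e-12:
                return False
    return True

ok = cond_indep_given_w(joint)
```

Here `ok` is true; perturbing any single entry of `joint` generally breaks the factorization.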
Conditional independence of more than two events, or of more than two random variables, is defined analogously.
The following two examples show that ''X'' ⊥ ''Y'' ''neither implies nor is implied by'' ''X'' ⊥ ''Y'' | ''W''.
First, suppose ''W'' is 0 with probability 0.5 and 1 otherwise. When ''W'' = 0, take ''X'' and ''Y'' to be independent, each having the value 0 with probability 0.99 and the value 1 otherwise. When ''W'' = 1, ''X'' and ''Y'' are again independent, but this time they take the value 1 with probability 0.99. Then ''X'' ⊥ ''Y'' | ''W''. But ''X'' and ''Y'' are dependent, because Pr(''X'' = 0) < Pr(''X'' = 0 | ''Y'' = 0). This is because Pr(''X'' = 0) = 0.5, but if ''Y'' = 0 then it is very likely that ''W'' = 0 and thus that ''X'' = 0 as well, so Pr(''X'' = 0 | ''Y'' = 0) > 0.5.
For the second example, suppose ''X'' ⊥ ''Y'', each taking the values 0 and 1 with probability 0.5. Let ''W'' be the product ''X'' × ''Y''. Then when ''W'' = 0, Pr(''X'' = 0 | ''W'' = 0) = 2/3, but Pr(''X'' = 0 | ''Y'' = 0, ''W'' = 0) = 1/2, so ''X'' ⊥ ''Y'' | ''W'' is false.
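The two examples above are small enough to verify by exact enumeration (no simulation needed); the following sketch reproduces the stated probabilities:

```python
from itertools import product

# First example: W ~ Bernoulli(0.5); given W = w, X and Y are i.i.d.
# with Pr(value = w) = 0.99. So X ⊥ Y | W holds by construction.
def p1(x, y, w):
    px = 0.99 if x == w else 0.01
    py = 0.99 if y == w else 0.01
    return 0.5 * px * py

p_x0 = sum(p1(0, y, w) for y in (0, 1) for w in (0, 1))          # Pr(X = 0)
p_x0_y0 = (sum(p1(0, 0, w) for w in (0, 1))
           / sum(p1(x, 0, w) for x in (0, 1) for w in (0, 1)))   # Pr(X = 0 | Y = 0)
# p_x0 is 0.5, while p_x0_y0 is about 0.98: X and Y are dependent.

# Second example: X, Y i.i.d. uniform on {0, 1}, W = X * Y.
outcomes = [(x, y, x * y) for x, y in product((0, 1), repeat=2)]  # each w.p. 1/4
w0 = [(x, y) for x, y, w in outcomes if w == 0]       # three equally likely outcomes
p_x0_w0 = sum(1 for x, y in w0 if x == 0) / len(w0)   # Pr(X = 0 | W = 0) = 2/3
p_x0_y0w0 = (sum(1 for x, y in w0 if x == 0 and y == 0)
             / sum(1 for x, y in w0 if y == 0))       # Pr(X = 0 | Y = 0, W = 0) = 1/2
# The two differ, so X ⊥ Y | W fails.
```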
This is also an example of "explaining away"; see Kevin Murphy's tutorial (http://people.cs.ubc.ca/~murphyk/Bayes/bnintro.html), where ''X'' and ''Y'' take the values "brainy" and "sporty".

Source: the free encyclopedia Wikipedia; the full text of "Conditional independence" can be read on Wikipedia.


